A No-Free-Lunch theorem for non-uniform distributions of target functions

Authors

  • Christian Igel
  • Marc Toussaint
Abstract

The sharpened No-Free-Lunch-theorem (NFL-theorem) states that, regardless of the performance measure, the performance of all optimization algorithms averaged uniformly over any finite set F of functions is equal if and only if F is closed under permutation (c.u.p.). In this paper, we first summarize some consequences of this theorem, which have been proven recently: The number of subsets that are c.u.p. is negligible compared to the total number of possible subsets. In particular, problem classes relevant in practice are not likely to be c.u.p. The average number of evaluations needed to find a desirable (e.g., optimal) solution can be calculated independently of the optimization algorithm in certain scenarios. Second, as the main result, the NFL-theorem is extended. Necessary and sufficient conditions for NFL-results to hold are given for arbitrary distributions of target functions. This yields the most general NFL-theorem for optimization presented so far. Mathematics Subject Classifications (2000): 90C27, 68T20.
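
To make the c.u.p. condition concrete, the following is a minimal Python sketch (illustrative only, not code from the paper): a function on a finite search space X with values in Y is represented by the tuple of its values, and a set F is closed under permutation if composing any member with any permutation of X again yields a member of F. The helper name is_cup and the toy value tables below are hypothetical.

    from itertools import permutations

    def is_cup(F):
        """Return True if the set F of value tables is closed under permutation
        of the (finite) search space, i.e. f in F implies f o sigma in F."""
        F = set(F)
        n = len(next(iter(F)))              # size of the search space X
        for f in F:
            for sigma in permutations(range(n)):
                # value table of f composed with sigma: x -> f(sigma(x))
                if tuple(f[s] for s in sigma) not in F:
                    return False
        return True

    # Toy example over X = {0, 1, 2}, Y = {0, 1}: the set of all "one-hot"
    # functions is c.u.p., while dropping one member breaks the closure.
    print(is_cup({(1, 0, 0), (0, 1, 0), (0, 0, 1)}))   # True
    print(is_cup({(1, 0, 0), (0, 1, 0)}))              # False

By the sharpened NFL-theorem quoted above, the uniform average of any performance measure over the first set is the same for every optimization algorithm, whereas no such guarantee follows for the second set.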


Related articles

Recent Results on No-Free-Lunch Theorems for Optimization

The sharpened No-Free-Lunch-theorem (NFL-theorem) states that the performance of all optimization algorithms averaged over any finite set F of functions is equal if and only if F is closed under permutation (c.u.p.) and each target function in F is equally likely. In this paper, we first summarize some consequences of this theorem, which have been proven recently: The average number of evaluati...


Two Broad Classes of Functions for Which a No Free Lunch Result Does Not Hold

We identify classes of functions for which a No Free Lunch result does and does not hold, with particular emphasis on the relationship between No Free Lunch and problem description length. We show that an NFL result does not apply to a set of functions when the description length of the functions is sufficiently bounded. We consider sets of functions with non-uniform associated probability distr...


No Free Lunch for Noise Prediction

No-free-lunch theorems have shown that learning algorithms cannot be universally good. We show that no free lunch exists for noise prediction as well. We show that when the noise is additive and the prior over target functions is uniform, a prior on the noise distribution cannot be updated, in the Bayesian sense, from any finite data set. We emphasize the importance of a prior over the target f...


On the Structure of Sequential Search: Beyond "No Free Lunch"

In sequential, deterministic, non-redundant search the algorithm permutes a test function to obtain the search result. The mapping from test functions to search results is a one-to-one correspondence. There is a partition of the set of functions into maximal blocks of functions that are permutations of one another. This permits reasoning about search to be reduced to reasoning about search with...
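
As a rough illustration of this permutation view of search (a sketch using a toy rule of my own, not the construction from that paper): a deterministic, non-redundant algorithm visits every point of a finite domain exactly once, choosing each next point only from what it has already observed, so the values it sees are exactly the test function's value table in permuted order.

    def greedy_search_trace(f):
        """Toy deterministic, non-redundant search: start at x = 0 and move to
        the unvisited point whose index is closest to the value just observed
        (ties broken towards the smaller index)."""
        unvisited = set(range(len(f)))
        x, trace = 0, []
        while unvisited:
            unvisited.remove(x)
            trace.append((x, f[x]))          # record the visited point and its value
            if unvisited:
                last_value = trace[-1][1]
                x = min(unvisited, key=lambda y: (abs(y - last_value), y))
        return trace

    f = (2, 0, 3, 1)                         # a test function on X = {0, 1, 2, 3}
    trace = greedy_search_trace(f)
    print(trace)                             # [(0, 2), (2, 3), (3, 1), (1, 0)]
    assert sorted(v for _, v in trace) == sorted(f)   # a permutation of f's values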


Boolean Functions Fitness Spaces

We investigate the distribution of performance of the Boolean functions of 3 Boolean inputs (particularly that of the parity functions) and of the always-on-6 and even-6 parity functions. We use enumeration, uniform Monte-Carlo random sampling and sampling of random full trees. As expected, XOR dramatically changes the fitness distributions. In all cases, once some minimum size threshold has been exceeded,...
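
The enumeration part of such a study is small enough to sketch directly (an illustrative sketch, not the authors' code; taking fitness as agreement with 3-input odd parity over all 8 input cases is my assumption):

    from collections import Counter
    from itertools import product

    inputs = list(product((0, 1), repeat=3))
    parity = tuple(sum(bits) % 2 for bits in inputs)    # odd-parity target table

    def fitness(table):
        """Number of the 8 input cases on which a truth table matches parity."""
        return sum(a == b for a, b in zip(table, parity))

    # Enumerate all 2**(2**3) = 256 Boolean functions of 3 inputs as 8-bit tables.
    histogram = Counter(fitness(table) for table in product((0, 1), repeat=8))
    for score in sorted(histogram):
        print(score, histogram[score])      # binomial counts 1, 8, 28, 56, 70, ...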



Journal:
  • J. Math. Model. Algorithms

Volume 3, Issue

Pages  -

Publication date: 2004